Reading all of these articles made it seem like employers view a college education in much the same way. Some fields that did not require a college degree in the past require one now. But why? Having a college degree may make it seem like potential employees know more about their field of study; however, that is not always the case. People can get hands-on training in the same field and be just as capable as someone with a college education.
